Current Issue: January–March 2013, Volume 2013, Issue 1 (5 Articles)
Empirical studies have repeatedly shown that autonomous artificial entities elicit social behavior on the part of the human interlocutor. Various theoretical approaches have tried to explain this phenomenon. The agency assumption states that the social influence of human interaction partners (represented by avatars) will always be higher than the influence of artificial entities (represented by embodied conversational agents). Conversely, the Ethopoeia concept predicts that automatic social reactions are triggered by situations as soon as they include social cues. Both theories have been challenged in a 2 × 2 between-subjects design with two levels of agency (low: agent, high: avatar) and two interfaces with different degrees of social cues (low: text chat, high: virtual human). The results show that participants in the virtual human condition reported a stronger sense of mutual awareness, imputed more positive characteristics, and allocated more attention to the virtual human than participants in the text chat conditions. Only one result supports the agency assumption: participants who believed they were interacting with a human reported a stronger feeling of social presence than participants who believed they were interacting with an artificial entity. It is discussed to what extent these results support the social cue assumption made in the Ethopoeia approach.
This research investigates whether a computer and an alternative input device in the form of sensor gloves can be used in the process of teaching children sign language. The presented work is important because no current literature investigates how sensor gloves can be used to assist children in the process of learning sign language. The research presented in this paper has been conducted by assembling hardware into sensor gloves and by designing software capable of (i) filtering out sensor noise, (ii) detecting intentionally posed signs, and (iii) correctly evaluating signals in signs posed by different children. Findings show that the devised technology can form the basis of a tool that teaches children sign language, and that there is potential for further research in this area.
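The abstract lists three software stages for the glove pipeline. The first two, noise filtering and detecting an intentionally held pose, can be sketched in a minimal form; the function names, the moving-average filter, and the hold-duration heuristic below are illustrative assumptions, not the authors' actual implementation:

```python
from collections import deque

def smooth(readings, window=5):
    """Moving-average filter to suppress jitter in raw flex-sensor readings
    (assumed approach; the paper does not specify its filter)."""
    buf = deque(maxlen=window)
    out = []
    for r in readings:
        buf.append(r)
        out.append(sum(buf) / len(buf))
    return out

def is_intentional_pose(smoothed, hold_frames=3, tolerance=0.05):
    """Treat a sign as intentionally posed when the smoothed value stays
    within `tolerance` over the last `hold_frames` samples, i.e. the
    child is holding the hand shape rather than moving through it."""
    if len(smoothed) < hold_frames:
        return False
    tail = smoothed[-hold_frames:]
    return max(tail) - min(tail) <= tolerance
```

Per-child evaluation (stage iii) would then compare the held pose against calibrated per-child reference signals, which is omitted here.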
The present study investigated the effects of brief synthesized spoken words with emotional content on the ratings of emotions and heart rate responses. Twenty participants' heart rate functioning was measured while they listened to a set of emotionally negative, neutral, and positive words produced by speech synthesizers. At the end of the experiment, ratings of emotional experiences were also collected. The results showed that the ratings of the words were in accordance with their valence. Heart rate deceleration was significantly strongest and most prolonged in response to the negative stimuli. The findings are the first to suggest that brief spoken emotionally toned words evoke a heart rate response pattern similar to that found earlier for more sustained emotional stimuli.
We have proposed a tactile geometry display technique based on active finger movement. The technique exploits a perceptual feature whereby, during finger movement, the length of a touched object is perceived to increase when the object is moved in the same direction as the finger movement, and to decrease when it is moved in the opposite direction. With this display technique, a wide range of tactile shapes can be presented with realistic rigid edges and continuous surfaces. In this work, to further develop our technique, we performed psychophysical experiments to study perceptions of length and roughness under this presentation technique. The results indicated that the elongation (or shrinkage) of the object can be observed regardless of the roughness of the touched object, and that the perceived roughness of the object changes slightly, but the changes are much smaller than those theoretically expected.
The latest smartphones with GPS, electronic compasses, directional audio, touch screens, and so forth hold a potential for location-based services that are easier to use and that let users focus on their activities and the environment around them. Rather than interpreting maps, users can search for information by pointing in a direction, and database queries can be created from GPS location and compass data. Users can also get guidance to locations through point and sweep gestures, spatial sound, and simple graphics. This paper describes two studies testing two applications with multimodal user interfaces for navigation and information retrieval. The applications allow users to search for information and get navigation support using combinations of point and sweep gestures, nonspeech audio, graphics, and text. Tests show that users appreciated both applications for their ease of use and for allowing users to interact directly with the surrounding environment.
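The core idea of building a database query from GPS location and compass data can be sketched as a bearing test: keep the points of interest whose bearing from the user falls inside a sector around the compass heading. The function names, the POI tuple format, and the ±15° sector width are assumptions for illustration, not details from the paper:

```python
import math

def bearing_to(lat1, lon1, lat2, lon2):
    """Initial great-circle bearing in degrees (0..360) from point 1 to point 2."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dlon = math.radians(lon2 - lon1)
    y = math.sin(dlon) * math.cos(phi2)
    x = math.cos(phi1) * math.sin(phi2) - math.sin(phi1) * math.cos(phi2) * math.cos(dlon)
    return (math.degrees(math.atan2(y, x)) + 360) % 360

def points_in_direction(user, heading, pois, half_angle=15.0):
    """Return names of POIs whose bearing from `user` (lat, lon) lies within
    +/- half_angle degrees of the compass `heading` -- the 'point to query'
    idea, sketched without any real spatial database behind it."""
    hits = []
    for name, lat, lon in pois:
        b = bearing_to(user[0], user[1], lat, lon)
        diff = abs((b - heading + 180) % 360 - 180)  # wrap-aware angular difference
        if diff <= half_angle:
            hits.append(name)
    return hits
```

In a real application this filter would run server-side over a spatial index, combined with a distance cutoff; here it only demonstrates the direction test.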